Testing the structure of the covariance matrix with fewer observations than the dimension
Authors
Abstract
We consider two hypothesis testing problems based on N independent observations of a single m-vector, when m > N and the N observations of the random m-vector are independently and identically distributed as multivariate normal with mean vector μ and covariance matrix Σ, both unknown. In the first problem, the m-vector is partitioned into two sub-vectors of dimensions m1 and m2, respectively, and we propose two tests for the independence of the two sub-vectors that are valid as (m, N) → ∞. The asymptotic distribution of the test statistics under the hypothesis of independence is shown to be standard normal, and the power is examined by simulations. The proposed tests perform better than the likelihood ratio test, although the latter can only be used when m is smaller than N. The second problem addressed is that of testing the hypothesis that the covariance matrix Σ has the intraclass correlation structure. A statistic for testing this is proposed and assessed via simulations; again the proposed test statistic compares favorably with the likelihood ratio test.
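The abstract specifies the data-generating setting but not the explicit form of the proposed statistics. The sketch below only reproduces that setting: N i.i.d. multivariate-normal m-vectors with m > N, partitioned into sub-vectors of dimensions m1 and m2, with an intraclass-correlation block used for one sub-vector. The Frobenius-norm summary of the sample cross-covariance is a hypothetical stand-in for a dependence measure, not one of the test statistics proposed in the paper, and the dimensions N = 20, m1 = 30, m2 = 20 are illustrative choices.

```python
# Illustrative sketch of the setting described in the abstract, NOT the
# paper's proposed tests: N i.i.d. normal m-vectors with m > N, split into
# two sub-vectors, generated under the null of independence.
import numpy as np

rng = np.random.default_rng(0)

N, m1, m2 = 20, 30, 20          # fewer observations (N) than dimension m = m1 + m2
m = m1 + m2

# Covariance under the null of independence: block-diagonal.
# Block 1 has an intraclass-correlation structure (variance 1, correlation 0.5).
Sigma1 = 0.5 * np.eye(m1) + 0.5 * np.ones((m1, m1))
Sigma2 = np.eye(m2)
Sigma = np.block([[Sigma1, np.zeros((m1, m2))],
                  [np.zeros((m2, m1)), Sigma2]])

X = rng.multivariate_normal(np.zeros(m), Sigma, size=N)   # N x m data matrix

# Sample covariance (singular here, since m > N) and its off-diagonal block
S = np.cov(X, rowvar=False)
S12 = S[:m1, m1:]

# Hypothetical summary of dependence between the two sub-vectors
dependence_measure = np.linalg.norm(S12, "fro") ** 2
print(f"||S12||_F^2 = {dependence_measure:.3f}")
```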
Similar resources
Some tests for the covariance matrix with fewer observations than the dimension under non-normality
This article analyzes whether the existing tests for the p × p covariance matrix Σ of N independent, identically distributed observation vectors with N ≤ p work under non-normality. We focus on three hypothesis testing problems: (1) testing for sphericity, that is, that the covariance matrix Σ is proportional to an identity matrix Ip; (2) that the covariance matrix Σ is an identity matrix Ip; and (3) t...
Tests for covariance matrices in high dimension with less sample size
In this article, we propose tests for covariance matrices of high dimension with fewer observations than the dimension for a general class of distributions with positive definite covariance matrices. In the one-sample case, tests are proposed for sphericity and for testing the hypothesis that the covariance matrix Σ is an identity matrix, by providing an unbiased estimator of tr[Σ] under the gener...
An Efficient Bayesian Optimal Design for Logistic Model
Consider a Bayesian optimal design with many support points, which poses the problem of collecting data with only a small number of observations at each design point. Under such a scenario the asymptotic property of using the Fisher information matrix for approximating the covariance matrix of posterior ML estimators might be doubtful. We suggest using the Bhattacharyya matrix in deriving the information matri...
Graph Matrix Completion in Presence of Outliers
The matrix completion problem has attracted a lot of attention in recent years. In the matrix completion problem, the goal is to recover a low-rank matrix from a subset of its entries. Graph matrix completion was introduced based on the fact that the relation between rows (or columns) of a matrix can be modeled as a graph structure. The graph matrix completion problem is formulated by adding the...
Testing Some Covariance Structures under a Growth Curve Model in High Dimension
In this paper we consider the problem of testing (a) sphericity and (b) intraclass covariance structure under a Growth Curve model. The maximum likelihood estimator (MLE) for the mean in a Growth Curve model is a weighted estimator involving the inverse of the sample covariance matrix, which is unstable for large p close to N and singular for p larger than N. The MLE for the covariance matrix is bas...
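For reference, the intraclass (compound-symmetry) covariance structure tested both in the abstract above and in this related entry is usually parameterized as follows; this is the standard textbook definition, not a formula quoted from either paper.

```latex
% Intraclass (compound-symmetry) covariance structure for an m-vector:
% equal variances \sigma^2 and a common correlation \rho between every pair
% of components; the constraint on \rho keeps \Sigma positive definite.
\Sigma \;=\; \sigma^{2}\left[(1-\rho)\, I_m \;+\; \rho\, \mathbf{1}_m \mathbf{1}_m^{\top}\right],
\qquad -\frac{1}{m-1} < \rho < 1 .
```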
Journal: J. Multivariate Analysis
Volume: 112
Pages: -
Publication date: 2012